perm filename BACKUS[S78,JMC] blob sn#359653 filedate 1978-06-06 generic text, type C, neo UTF8
COMMENT ⊗   VALID 00002 PAGES
C REC  PAGE   DESCRIPTION
C00001 00001
C00002 00002	.require "memo.pub[let,jmc]" source
C00013 ENDMK
C⊗;
.require "memo.pub[let,jmc]" source;
Random Remarks on "Can programming be liberated from the von Neumann
style?  A functional style and its algebra of programs" by John Backus
.item←0


	#. It isn't clear to me that functional programming per se solves
the bottleneck problem.  That is solved by parallelism, which can be
done either functionally or in sequential notation.

	#. It further seems to me that simple sequential programming
with just variable assignments and gotos has one important virtue.
Namely, no information is hidden between statements, although the
values of subexpressions are hidden within statements.  It seems to
me that languages having this property will remain important at
least for some theories.  In particular, they will be necessary for some
kinds of description of hardware, because all information preserved
in a computing system has to be somewhere at all times.

	#. Reduction semantics has some disadvantages compared to
model-theoretic semantics.  It requires either non-denumerable
domains of expressions, e.g. expressions containing arbitrary real
numbers, or artificial restrictions to specific symbolic expression
domains for numbers.

	#. That a concept is "thirty years old" doesn't tell us anything in itself.
No-one has modernized the integers or the concept of four-wheeled
vehicle.  In fact, the idea that the von Neumann style computer is
out-of-date is at least 20 years old and so can be accused of being
out-of-date itself.

	#. It does seem plausible that the von Neumann
style (I call it sequential programming) creates a bottleneck, but
it is somewhat invidious to introduce a technical term whose name
implies a personal criticism.  A remark accepting some
of the "blame" wouldn't solve the problem, unless the name were to
be changed to "Backus bottleneck".

	#. p8. I agree that the right side of assignment statements
is a more logical world than the left.

	#. It sure is true that pure LISP is often buried (p10), and
I don't yet know whether that is necessary or not.  It is certainly
necessary with LISP in the shape it is.

	#. It doesn't seem that the criticism that ⊗n is in the program
is entirely to the point.  ⊗n is as changeable as anything else in
the program.  ⊗g is to the point.

	#. I am dubious about the inner product example for several reasons:
First, while the inner product can be computed in the way described,
it shouldn't be, e.g. nothing should be transposed.  Second, since
α corresponds to LISP ⊗mapcar, it has the same limitations, and won't
work when ⊗maplist is wanted, e.g. when the indices need to be tested.
Third, ⊗Insert is a dubious operation, making sense only when the
operation is associative.  Finally, the whole definition makes no
sense for zero length vectors, while the Algol 60 definition gives
the right answer in that case.  You can't get out of zero length
vectors if you want (as Backus demands) that the length be a variable.
A variable may turn out to have the value zero.
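
	The zero-length point can be made concrete with a sketch in
Python (standing in for Backus's FP notation; the names are mine).
The Insert-style definition, rendered with ⊗reduce over a mapped
multiplication, has no unit and fails on empty vectors, while the
Algol 60 style accumulation returns 0 for them.

```python
from functools import reduce

# Backus's IP = (/+) . (alpha x) . trans, sketched with reduce for
# Insert and a comprehension for the alpha-map over the transpose.
# reduce with no initial value has no unit, so it raises on
# zero-length vectors.
def ip_fp(xs, ys):
    return reduce(lambda a, b: a + b, [x * y for x, y in zip(xs, ys)])

# The Algol 60 style accumulation, which gives the right answer, 0,
# when the length turns out to be zero.
def ip_loop(xs, ys):
    acc = 0
    for x, y in zip(xs, ys):
        acc += x * y
    return acc
```

Both agree on non-empty vectors; only the accumulating version is
defined when the length is zero.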

	#. In practice, a small framework and completely variable
parts leads to difficulties, because the parts become implementation
dependent or require manuals themselves.  In principle, the creators
of parts have the same obligation to provide manuals as the language
designers, but they usually evade this responsibility.

	#. Damnit, let's call them Backus languages!!

	#. One suspects on p17, that the author isn't very familiar
with much of the literature.  Anyway, the polemical style leads me
to defend sequential programs even though I also prefer applicative
systems.

	#. At the bottom of p20, it looks like the camel has got
his head back in the tent.

	#. On p21, criterion ⊗a proposes doing without variables.
Combinatory logic does it, but the cost in clarity is too great.
I guess I would bet that variables can't be dispensed with in the
long run without making obscure programs.
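
	The cost in clarity can be exhibited in a few lines of Python
(my own example, not one from the paper): the same function written
with a variable and written from combinators alone.

```python
# Basic combinators; variables appear only in defining the combinators
# themselves, never in the function built from them.
s = lambda f: lambda g: lambda x: f(x)(g(x))   # S
b = lambda f: lambda g: lambda x: f(g(x))      # B (composition)
mul = lambda x: lambda y: x * y
add = lambda n: lambda x: x + n

# With a variable: transparent.
def f_pointed(x):
    return (x + 1) * (x + 2)

# Without variables: S (B mul (add 1)) (add 2).  Correct, but one must
# expand the combinators to see that it computes (x+1)*(x+2).
f_point_free = s(b(mul)(add(1)))(add(2))
```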

	#. The goal (p21) of a program algebra is fine, but certain
statements one wants to make require more logic than just algebraic
identities.

	#. On p23 Backus displays the common desire to restrict people
for their own good.  %2Sic semper tyrannis%1.

	#. The functional forms on p24 appear to be higher order
functions.  They are constants and have names.  First level functions
are constructed from basic functions and are given names.  The objects
appear to have only limited capacity to be named.

	#. Objects appear on p25 to be LISP type lists.  He gives up
unrestricted ⊗cons.  Using the numbers as operators in this context
is an improvement.  It was first suggested by Scott.

	#. Note on p26 that while the language has no variables, it
has been convenient to use variables in making definitions.
The poor users cry, "Why can't we have variables too?".

	#. On p26, he will be sorry he said that the function symbols
denoted by sans serif numerals are different from the serifed numerals.
What if we want the %2n%1th element of a list where ⊗n is computed?
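
	What is wanted is a selector-maker rather than a fixed family
of selector symbols.  A sketch in Python (my construction, not
Backus's):

```python
# sel(n) is the n-th-element selector (1-origin, as in the paper's
# selectors).  Because sel is an ordinary function, the index n can
# itself be a computed value, which a fixed alphabet of distinct
# sans serif selector symbols does not allow.
def sel(n):
    return lambda xs: xs[n - 1]
```

For example, sel(1 + 1) is a perfectly good selector, while there is
no sans serif numeral for a computed index.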

	#. The definitional style on p26 is convenient.  It depends
on the reader's ability to decide that free variables are used in
patterns that must be matched.  Will his users get some too?

	#. On p27, ⊗reverse appears as a basic function.  If this
is necessary or even especially convenient, something is rotten.

	#. The logical operators are taken to be strict.  He never
said whether conditional expressions are strict.

	#. ⊗apndl is ⊗cons.

	#. Well, that's a lot more basic functions than LISP has.

	#. Using ⊗x with a bar to denote the constant function means
that the constant maker itself cannot be composed.
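
	By contrast, if the constant-maker is itself a function rather
than a bar drawn over a symbol, it composes like anything else.  A
sketch in Python (names mine):

```python
# const is an ordinary function from values to constant functions,
# so it can be passed to other functionals and composed with itself.
def const(x):
    return lambda _ignored: x

always_zero = const(0)

# Composing the maker with itself: a constant function of two
# (curried) arguments.
def const2(x):
    return const(const(x))
```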

	#. A comparison with combinatory logic with its K and S
combinators is called for.
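
	For reference, K and S in curried Python form, together with
the classical fact that S K K is the identity:

```python
# The two basic combinators of combinatory logic.
k = lambda x: lambda y: x                       # K: the constant maker
s = lambda f: lambda g: lambda x: f(x)(g(x))    # S: generalized application

# S K K behaves as the identity: s(k)(k)(x) = k(x)(k(x)) = x.
identity = s(k)(k)
```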

	#. ⊗construction is interesting, because it is not a
simple combinator.  It depends on its object argument being a
list of the right length and so is not in the spirit of combinatory
logic.
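
	Construction can be sketched in Python (my rendering) to show
why it is not a simple combinator: it is parametrized by a whole
list of functions rather than having a fixed arity.

```python
# Backus's construction [f1, ..., fn] : x = <f1:x, ..., fn:x>.
def construction(fs):
    return lambda x: [f(x) for f in fs]
```

For example, construction applied to the successor and doubling
functions sends 10 to the pair <11, 20>.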

	#. On p29 it becomes clear that conditional expressions are
not considered strict.
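
	The distinction matters for recursive definitions.  A small
Python illustration (mine): a conditional that evaluates only the
arm chosen lets the recursion bottom out, while a strict one, which
evaluates both arms first, never reaches the base case.

```python
# Python's conditional expression evaluates only the chosen arm, so
# this recursion terminates.
def factorial(n):
    return 1 if n == 0 else n * factorial(n - 1)

# A strict conditional: by the time strict_if is entered, both arms
# have already been evaluated as arguments.
def strict_if(b, then_val, else_val):
    return then_val if b else else_val

def fact_strict(n):
    # n * fact_strict(n - 1) is evaluated even when n == 0, so the
    # recursion never terminates (a RecursionError in practice).
    return strict_if(n == 0, 1, n * fact_strict(n - 1))
```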

	#. ⊗insert associates to the right.  A recursive definition
by gum.  Why can't the users have them?  Most likely because they
can't be trusted.
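
	The recursion in question is one line (a Python sketch, names
mine), which makes withholding it from users look all the odder:

```python
# Insert (/f), associated to the right, as a plain recursive
# definition.  As noted above, it is undefined on empty vectors.
def insert(f, xs):
    if not xs:
        raise ValueError("insert undefined on the empty vector")
    if len(xs) == 1:
        return xs[0]
    return f(xs[0], insert(f, xs[1:]))
```

With a non-associative f the right association shows: insert with
subtraction over <1, 2, 3> gives 1 - (2 - 3) = 2.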

	#. %2binary to unary%1 looks ad hoc.  Why not %2ternary to binary%1?

	#. ⊗while allows full recursive definition but of the iterative
kind only.  It won't give ⊗n! directly, but I know he gets the recursive
factorial later.
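
	What ⊗while does give is the iterative version, with the loop
state carried as a pair.  A Python sketch (my rendering of the form):

```python
# Backus's while form: apply f repeatedly while p holds.
def while_(p, f):
    def loop(x):
        while p(x):
            x = f(x)
        return x
    return loop

# Iterative factorial: the state the loop threads is the pair <i, acc>.
def fact_iter(n):
    i, acc = while_(lambda st: st[0] > 0,
                    lambda st: (st[0] - 1, st[1] * st[0]))((n, 1))
    return acc
```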

	#. On p30 LISP type recursive definitions are explicitly
introduced.  Is there mutual recursion, and is there a way of naming
recursive functions analogous to ⊗label?  If not, he can't use them
freely in his functional forms like α (⊗mapcar).

	#. The definition of ⊗well ⊗formed at the bottom of p.30
hints at mutual recursion.  The condition is rather vague.  Well,
maybe not, since only single symbols can appear on left sides.

	#. With FP completed on p31, I wonder how one writes a polynomial
such as λx.(x↑2 + 2x + 3)(x↑2 + 4).
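
	My guess at how it comes out, sketched in Python: with a
lambda variable the polynomial is immediate; in the variable-free
construction-then-combine style one builds the pair <p:x, q:x> by
construction and multiplies its components, and the two factors must
themselves be defined somehow — here I still write them with
variables, which rather makes the point.

```python
# With a variable: immediate.
def poly(x):
    return (x ** 2 + 2 * x + 3) * (x ** 2 + 4)

# Variable-free at the top level, FP-style: construction then combine.
def construction(fs):
    return lambda x: [f(x) for f in fs]

def compose(f, g):
    return lambda x: f(g(x))

mul_pair = lambda pair: pair[0] * pair[1]
p = lambda x: x ** 2 + 2 * x + 3   # the factors still use variables
q = lambda x: x ** 2 + 4
poly_free = compose(mul_pair, construction([p, q]))
```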